Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
Authors
Abstract
Similar resources
Structured Nonconvex and Nonsmooth Optimization: Algorithms and Iteration Complexity Analysis
Nonconvex optimization problems are frequently encountered in much of statistics, business, science and engineering, but they are not yet widely recognized as a technology. A reason for this relatively low degree of popularity is the lack of a well-developed system of theory and algorithms to support the applications, as is the case for its convex counterpart. This paper aims to take one step i...
Nonsmooth Equations in Optimization (Nonconvex Optimization and Its Applications)
The titles published in this series are listed at the end of this volume. Contents include: Multifunctions and Derivatives; Particular Locally Lipschitz Functions and Related Definitions; Generalized Jacobians of Locally Lipschitz Functions; Pseudo-Smoothness and D°f; Piecewise Functions; NCP Functions; Definitions of Regularity; Definitions of Lipschitz Properties; Regularity Definitions; Fun...
Fast Stochastic Methods for Nonsmooth Nonconvex Optimization
We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonconvex part is smooth and the nonsmooth part is convex. Surprisingly, unlike the smooth case, our knowledge of this fundamental problem is very limited. For example, it is not known whether the proximal stochastic gradient method with constant minibatch converges to a stationary point. To tack...
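The method discussed in this abstract, a proximal stochastic gradient iteration for finite sums with a smooth nonconvex part and a convex nonsmooth part, can be illustrated with a minimal sketch. This is not code from the paper; the l1 regularizer, the quadratic components f_i, and all names and step sizes below are assumptions chosen for concreteness:

```python
import numpy as np

# Illustrative sketch (not from the paper): minibatch proximal SGD for
#   min_x (1/n) * sum_i f_i(x) + lam * ||x||_1,
# where each f_i is smooth (possibly nonconvex) and the l1 term is the
# convex nonsmooth part handled via its proximal operator.

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_sgd_step(x, grads, batch_idx, step, lam):
    """One step: minibatch gradient step on the smooth part,
    then the prox of the nonsmooth l1 term."""
    g = np.mean([grads[i](x) for i in batch_idx], axis=0)
    return soft_threshold(x - step * g, step * lam)

# Toy instance: f_i(x) = 0.5 * ||x - a_i||^2, so grad f_i(x) = x - a_i.
rng = np.random.default_rng(0)
a = rng.normal(size=(5, 3))
grads = [lambda x, ai=ai: x - ai for ai in a]

x = np.zeros(3)
for _ in range(200):
    batch = rng.choice(5, size=2, replace=False)  # constant minibatch size
    x = prox_sgd_step(x, grads, batch, step=0.1, lam=0.1)
```

The constant-minibatch regime sketched here is exactly the setting whose convergence to a stationary point the abstract flags as open for the nonsmooth case.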
Existence of Solutions for Nonconvex and Nonsmooth Vector Optimization Problems
We consider the weakly efficient solution for a class of nonconvex and nonsmooth vector optimization problems in Banach spaces. We show the equivalence between the nonconvex and nonsmooth vector optimization problem and the vector variational-like inequality involving set-valued mappings. We prove some existence results concerned with the weakly efficient solution for the nonconvex and nonsmoot...
Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity
We consider, for the first time, general diminishing stepsize methods for nonconvex, constrained optimization problems. We show that by using directions obtained in an SQP-like fashion convergence to generalized stationary points can be proved. In order to do so, we make use of classical penalty functions in an unconventional way. In particular, penalty functions only enter in the theoretical a...
Journal
Journal title: Computational Optimization and Applications
Year: 2018
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-018-0034-y